The Relation of Speech and Gestures: Temporal Synchrony Follows Semantic Synchrony
Authors
Abstract
The close relationship between speech and gestures is conspicuous in the temporal coordination of the two modalities. In this paper we investigate to what extent temporal synchrony is affected by the semantic relationship between gestures and their lexical affiliates. The results show that when both modalities redundantly express the same information, the gesture's onset is closer to that of the accompanying lexical affiliate than when gestures convey complementary information: the closer speech and gestures are related semantically, the closer is their temporal relation. This novel finding is discussed with respect to its implications for the production process of speech and gestures.
Similar articles
A Grammar for Language and Co-verbal Gesture
Meaning in everyday communication is conveyed by various signals including spoken utterances and spontaneous hand gestures. The literature has attested that gestures function in synchrony with speech to deliver an integrated message, or a “single thought” [5], [1], exhibit language-specific properties [2] and are subject to formal semantic modeling [3]. One of the challenges in modeling synchro...
Hearing and seeing meaning in speech and gesture: insights from brain and behaviour.
As we speak, we use not only the arbitrary form-meaning mappings of the speech channel but also motivated form-meaning correspondences, i.e. iconic gestures that accompany speech (e.g. inverted V-shaped hand wiggling across gesture space to demonstrate walking). This article reviews what we know about processing of semantic information from speech and iconic gestures in spoken languages during ...
An Information Theoretic Analysis of the Temporal Synchrony Between Head Gestures and Prosodic Patterns in Spontaneous Speech
We analyze the temporal coordination between head gestures and prosodic patterns in spontaneous speech in a data-driven manner. For this study, we consider head motion and speech data from 24 subjects while they tell a fixed set of five stories. The head motion, captured using a motion capture system, is converted to Euler angles and translations in the X, Y and Z directions to represent head gest...
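The information-theoretic flavor of the analysis described above can be illustrated with a small sketch. This is not the paper's actual method; it is a minimal, hypothetical example that estimates mutual information between two 1-D signals (say, a prosodic pitch track and one head-rotation angle) from a joint histogram, under assumed synthetic data:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in estimate of mutual information (in bits) between two
    1-D signals, using a joint histogram. Coarse, for illustration only."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                      # joint distribution
    px = pxy.sum(axis=1, keepdims=True)            # marginal of x
    py = pxy.sum(axis=0, keepdims=True)            # marginal of y
    mask = pxy > 0                                 # avoid log(0)
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

# Hypothetical signals: a "pitch" track and a slightly lagged, noisy
# copy standing in for a correlated head-rotation angle.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
pitch = np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.standard_normal(t.size)
head = np.sin(2 * np.pi * 0.5 * (t - 0.2)) + 0.1 * rng.standard_normal(t.size)
unrelated = rng.standard_normal(t.size)

# The temporally coordinated pair shares more information than the unrelated one.
print(mutual_information(pitch, head) > mutual_information(pitch, unrelated))
```

A real analysis would need a consistent estimator and significance testing against time-shuffled baselines; the sketch only conveys the idea of quantifying cross-modal coordination as shared information.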
A study of multimodal motherese: the role of temporal synchrony between verbal labels and gestures.
This study examined European American and Hispanic American mothers' multimodal communication to their infants (N = 24). The infants were from three age groups representing three levels of lexical-mapping development: prelexical (5 to 8 months), early-lexical (9 to 17 months), and advanced-lexical (21 to 30 months). Mothers taught their infants four target (novel) words by using distinct object...
Model-based Animation of Coverbal Gesture
Virtual conversational agents are supposed to combine speech with nonverbal modalities to produce intelligible and believable utterances. However, the automatic synthesis of coverbal gestures still faces several problems, such as naturalness in procedurally generated animations, flexibility in pre-defined movements, and synchronization with speech. In this paper, we focus on generating complex m...
Journal:
Volume, Issue
Pages -
Publication date: 2011